Successive Overrelaxation for Support Vector Regression

Authors

  • Yong Quan
  • Jie Yang
  • Chenzhou Ye
Abstract

Training an SVR (support vector regression) model requires the solution of a very large QP (quadratic programming) optimization problem. Although this type of problem is well understood, the existing training algorithms are complex and slow. To address this, the paper first introduces a reformulation that gives SVR a mathematical form similar to that of a support vector machine. A versatile iterative method, successive overrelaxation, is then proposed for training. Experimental results show that the new method converges considerably faster than other methods that require a substantial amount of data to be held in memory. The results give guidelines for applying the method to large domains.
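The abstract does not spell out the update rule, but SOR applied to an SVM/SVR-style dual is typically a projected Gauss-Seidel sweep with a relaxation factor ω between 0 and 2, updating one dual variable at a time. The Python sketch below illustrates that generic technique on a box-constrained quadratic program min ½αᵀQα − eᵀα subject to 0 ≤ α ≤ C; the function name, the matrix Q, the bound C, and ω = 1.3 are illustrative assumptions, not the authors' exact SVR formulation.

```python
import numpy as np

def projected_sor(Q, C, omega=1.3, n_sweeps=200, tol=1e-6):
    """Illustrative projected SOR for: min 0.5*a'Qa - sum(a), s.t. 0 <= a <= C.

    A generic sketch of the SOR technique the paper builds on, not a
    reproduction of the authors' reformulated SVR problem.
    """
    n = Q.shape[0]
    alpha = np.zeros(n)
    for _ in range(n_sweeps):
        max_change = 0.0
        for i in range(n):                     # one variable (data point) at a time
            grad_i = Q[i] @ alpha - 1.0        # partial derivative of the objective
            new_i = alpha[i] - omega * grad_i / Q[i, i]
            new_i = min(max(new_i, 0.0), C)    # project back onto the box [0, C]
            max_change = max(max_change, abs(new_i - alpha[i]))
            alpha[i] = new_i                   # Gauss-Seidel style: reuse immediately
        if max_change < tol:                   # stop once a full sweep barely moves
            break
    return alpha

# Tiny usage example on a random positive-definite Q (purely illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Q = A @ A.T + 0.5 * np.eye(5)
print(projected_sor(Q, C=1.0))
```

Because each step touches a single variable and uses only the corresponding row of Q, SOR-style sweeps avoid keeping a large working set in memory, which is the scalability motivation cited in the abstract.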


Similar articles

The New Development in Support Vector Machine Algorithm Theory and Its Application

For the classification problem, this paper puts forward the combinatorial optimization least squares support vector machine algorithm (COLS-SVM). Based on an algorithmic analysis of COLS-SVM and improvements to it, the improved COLS-SVM can be applied to individual credit evaluation. For the regression problem, an appropriate kernel function and parameters were selected based on the analysis of support vector r...


Successive overrelaxation for support vector machines

Successive overrelaxation (SOR) for symmetric linear complementarity problems and quadratic programs is used to train a support vector machine (SVM) for discriminating between the elements of two massive datasets, each with millions of points. Because SOR handles one point at a time, similar to Platt's sequential minimal optimization (SMO) algorithm which handles two constraints at a time and J...


A Fast Bounded Parametric Margin Model for Support Vector Machine

In this paper, a fast bounded parametric margin ν-support vector machine (BPν-SVM) for classification is proposed. Unlike the parametric margin ν-support vector machine (par-ν-SVM), the BPν-SVM maximizes a bounded parametric margin, and consequently the successive overrelaxation (SOR) technique can be used to solve its dual problem as opposed to solving the standard quadratic progr...


Supremum-Norm Convergence for Step-Asynchronous Successive Overrelaxation on M-matrices

Step-asynchronous successive overrelaxation updates the values contained in a single vector using the usual Gauß–Seidel-like weighted rule, but arbitrarily mixing old and new values, the only constraint being temporal coherence: you cannot use a value before it has been computed. We show that given a nonnegative real matrix A, a σ ≥ ρ(A) and a vector w > 0 such that Aw ≤ σw, every iteration of ...
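For reference, the classical synchronous SOR sweep that this step-asynchronous rule generalizes solves a linear system Ax = b with the Gauß–Seidel-like weighted update (generic notation, relaxation factor ω):

\[
x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \frac{\omega}{a_{ii}}\Bigl(b_i - \sum_{j<i} a_{ij}\,x_j^{(k+1)} - \sum_{j>i} a_{ij}\,x_j^{(k)}\Bigr).
\]

The step-asynchronous variant described in the excerpt allows the terms on the right-hand side to mix old and new components arbitrarily, subject only to the temporal-coherence constraint.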


Data Discrimination via Nonlinear Generalized Support Vector Machines

The main purpose of this paper is to show that new formulations of support vector machines can generate nonlinear separating surfaces which can discriminate between elements of a given set better than a linear surface. The principal approach used is that of generalized support vector machines (GSVMs), which employ possibly indefinite kernels [17]. The GSVM training procedure is carried out by eith...



Publication year: 2003